
# Whole-word N-gram masking

Chinese MacBERT Large
License: Apache-2.0
MacBERT is an improved Chinese BERT model that adopts MLM as correction (Mac) as its masked language model pre-training task, alleviating the discrepancy between the pre-training and fine-tuning stages (a loading sketch follows below).
Tags: Large Language Model, Chinese
Publisher: hfl
Stats: 13.05k · 42
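
The sketch below shows how this model could be loaded for masked-token prediction with the Hugging Face `transformers` library. The model ID `hfl/chinese-macbert-large` and the use of the standard BERT classes follow HFL's usual release conventions and are assumptions, not part of this page.

```python
# Minimal sketch: loading Chinese MacBERT Large for masked-token prediction.
# Assumption: the checkpoint is published as "hfl/chinese-macbert-large" and
# is loadable with the standard BERT tokenizer/model classes.
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("hfl/chinese-macbert-large")
model = BertForMaskedLM.from_pretrained("hfl/chinese-macbert-large")

# Predict the [MASK] token in a short Chinese sentence.
text = "今天天气真[MASK]。"
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and take the highest-scoring token.
mask_index = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
predicted_id = logits[0, mask_index].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```

Note that "MLM as correction" changes only the pre-training objective (similar words replace masked positions instead of a `[MASK]` token), so downstream loading and fine-tuning work exactly as with an ordinary BERT checkpoint.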